# Low-energy LLM

BitNet b1.58 2B4T
License: MIT
The first open-source, native 1-bit large language model at the 2-billion-parameter scale, developed by Microsoft Research and trained on 4 trillion tokens. It demonstrates that a natively trained 1-bit LLM can substantially improve computational efficiency while maintaining performance comparable to full-precision open-source models of the same scale (the ternary-weight idea behind this is sketched after the listing).
Tags: Large Language Model · Transformers · English
Developer: microsoft
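
The efficiency claim rests on ternary weights: each weight takes one of three values {-1, 0, +1} (about 1.58 bits of information), so matrix multiplications reduce to sign-controlled additions of activations, and weights can be stored far more compactly than 16-bit floats. The NumPy sketch below is a hypothetical illustration of absmean-style ternary quantization under those assumptions; the function name, shapes, and epsilon are invented for the example, and this is not the model's actual code.

```python
import numpy as np


def ternary_quantize(w: np.ndarray, eps: float = 1e-8) -> tuple[np.ndarray, float]:
    """Quantize a weight tensor to {-1, 0, +1} with an absmean scale.

    Illustrative sketch of the 1.58-bit (ternary) weight idea; not the
    model's actual training-time quantizer.
    """
    scale = float(np.mean(np.abs(w))) + eps      # per-tensor absmean scale
    q = np.clip(np.round(w / scale), -1.0, 1.0)  # round, then clip to {-1, 0, +1}
    return q, scale


# Example: a ternary matmul only needs adding, skipping, or subtracting activations.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8)).astype(np.float32)   # hypothetical weight matrix
x = rng.normal(size=(8,)).astype(np.float32)     # hypothetical activation vector

Wq, s = ternary_quantize(W)
y_ternary = s * (Wq @ x)   # dequantized ternary product
y_float = W @ x            # full-precision reference

print("ternary weight values:", np.unique(Wq))
print("mean abs error vs full precision:", float(np.abs(y_float - y_ternary).mean()))
```

The "native" qualifier in the description refers to training with these low-bit weights from the start rather than quantizing a full-precision model afterward, which is what allows accuracy to stay close to full-precision peers of the same size.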